113 research outputs found

    Requirements on the LFI On-Board Compression

    Get PDF
Final version. The present document describes the requirements for the compression program and the on-board compression operations for Planck/LFI.

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    Full text link
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule in order to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements. Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
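    The inject-and-compare strategy for the housekeeping telemetry lends itself to a simple automated check. Below is a minimal, hypothetical Python sketch of such a step; the packet fields, the calibration function and the timeline structure are assumptions for illustration, not the actual LFI Level 1 interfaces.

    ```python
    # Hypothetical sketch of an inject-and-compare housekeeping validation step.
    # The packet/timeline structures are illustrative, not the real LFI formats.
    from dataclasses import dataclass

    @dataclass
    class HKPacket:
        obt: float          # on-board time of the packet
        raw_value: int      # raw housekeeping parameter value

    def calibrate(raw: int) -> float:
        """Toy calibration curve (assumed linear) applied by the Level 1 pipeline."""
        return 0.01 * raw - 5.0

    def level1_timeline(packets: list[HKPacket]) -> list[tuple[float, float]]:
        """Stand-in for the Level 1 HK processing: (time, calibrated value) pairs."""
        return [(p.obt, calibrate(p.raw_value)) for p in packets]

    def validate(injected: list[HKPacket], timeline: list[tuple[float, float]],
                 tol: float = 1e-6) -> bool:
        """Pass each injected raw value through the same calibration and compare
        it with the corresponding timeline sample."""
        if len(injected) != len(timeline):
            return False
        return all(abs(calibrate(p.raw_value) - v) <= tol and p.obt == t
                   for p, (t, v) in zip(injected, timeline))

    if __name__ == "__main__":
        packets = [HKPacket(obt=t, raw_value=1000 + t) for t in range(10)]
        assert validate(packets, level1_timeline(packets))
        print("HK timeline matches injected values")
    ```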

    An origin for small neutrino masses in the NMSSM

    Get PDF
    We consider the Next to Minimal Supersymmetric Standard Model (NMSSM), which provides a natural solution to the so-called mu problem by introducing a new gauge-singlet superfield S. We show that a new mechanism of neutrino mass suppression, based on the R-parity violating bilinear terms mu_i L_i H_u mixing neutrinos and higgsinos, arises within the NMSSM, thus offering an original solution to the neutrino mass problem (connected to the solution of the mu problem). We generate realistic (Majorana) neutrino mass values without requiring any strong hierarchy among the fundamental parameters, in contrast with alternative models. In particular, the ratio |mu_i/mu| can reach about 10^-1, unlike in the MSSM where it has to be much smaller than unity. We check that the obtained parameters also satisfy the collider constraints and internal consistencies of the NMSSM. The price to pay for this new cancellation-type mechanism of neutrino mass reduction is a certain fine tuning, which is significantly reduced in some regions of parameter space. In addition, we discuss the feasibility of our scenario when the R-parity violating bilinear terms have a common origin with the mu term, namely when they are generated via a VEV of the S scalar component from the couplings lambda_i S L_i H_u. Finally, we comment on some specific phenomenology of the NMSSM in the presence of R-parity violating bilinear terms. Comment: 21 pages, 5 figures, LaTeX file
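    For orientation, the terms named in the abstract can be written schematically as follows; this is a sketch based only on the couplings mentioned above, with the standard NMSSM singlet self-coupling included and conventions possibly differing from the paper's.

    ```latex
    % Schematic NMSSM superpotential terms referred to in the abstract
    % (Yukawa couplings omitted; conventions may differ from the paper).
    W \supset \lambda\, S\, H_u H_d \;+\; \frac{\kappa}{3}\, S^3
            \;+\; \mu_i\, L_i H_u ,
    \qquad \mu_{\rm eff} = \lambda \langle S \rangle .
    % If the bilinear terms share the origin of the mu term, they arise from
    % \lambda_i\, S\, L_i H_u, giving \mu_i = \lambda_i \langle S \rangle .
    ```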

    Modulation of Tregs and iNKT by fingolimod in multiple sclerosis patients

    Get PDF
    Altered numbers and functions of cells belonging to immunoregulatory cell networks, such as T regulatory (Treg) and invariant Natural Killer T (iNKT) cells, have been reported in Multiple Sclerosis (MS), an immune-mediated disease. We aimed to assess the frequencies of Tregs and iNKT cells in MS patients throughout a one-year treatment with fingolimod (FTY) and to correlate the immunological data with efficacy and safety data. The percentage of Tregs (defined as Live/Dead−CD3+CD4+FoxP3+CD25++/CD127− cells) increased steadily throughout the year, while there was no significant difference in the absolute number or percentage of iNKT cells (defined as CD3+CD14−CD19−Vα24-Jα18 TCR+ cells). However, among iNKT cells, the percentages of CD8+ iNKT and CD4−CD8− double-negative (DN) cells steadily increased, while the percentage of CD4+ iNKT cells decreased significantly. The mean percentage of CD8+ T cells at all time-points was lower in patients with infections throughout the study. The numbers and percentages of DN iNKT cells were higher, considering all time-points, in patients who presented a clinical relapse. FTY may, therefore, exert its beneficial effect in MS patients through various mechanisms, including the increase in Tregs and in iNKT subsets with immunomodulatory potential such as CD8+ iNKT cells. The occurrence of infections was associated with lower mean CD8+ cell counts during treatment with FTY.

    Naturalness and Fine Tuning in the NMSSM: Implications of Early LHC Results

    Get PDF
    We study the fine tuning in the parameter space of the semi-constrained NMSSM, where most soft Susy breaking parameters are universal at the GUT scale. We discuss the dependence of the fine tuning on the soft Susy breaking parameters M_1/2 and m_0, and on the Higgs masses in NMSSM-specific scenarios involving large singlet-doublet Higgs mixing or dominant Higgs-to-Higgs decays. Whereas these latter scenarios allow a priori for considerably less fine tuning than the constrained MSSM, the early LHC results rule out a large part of the parameter space of the semi-constrained NMSSM corresponding to low values of the fine tuning. Comment: 19 pages, 10 figures, bounds from Susy searches with ~1/fb included
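    Although the abstract does not spell out the fine-tuning measure, studies of this kind conventionally use a Barbieri-Giudice-type sensitivity of the electroweak scale to the fundamental parameters; a schematic form is shown below as an assumption for orientation, not as a quote of the paper's definition.

    ```latex
    % Conventional fine-tuning measure (Barbieri-Giudice type), shown for
    % orientation; the precise definition used in the paper may differ.
    \Delta \;=\; \max_{i}\,
      \left| \frac{\partial \ln M_Z^2}{\partial \ln p_i} \right| ,
    \qquad p_i \in \{\, m_0,\; M_{1/2},\; A_0,\; \lambda,\; \kappa,\; \dots \}
    ```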

    Novel signatures for vector-like quarks

    Get PDF
    We consider supersymmetric extensions of the standard model with a vector-like doublet (T B) of quarks with charge 2/3 and −1/3, respectively. Compared to non-supersymmetric models, there is a variety of new decay modes for the vector-like quarks, involving the extra scalars present in supersymmetry. The importance of these new modes, yielding multi-top, multi-bottom and also multi-Higgs signals, is highlighted by the analysis of several benchmark scenarios. We show how the triangles commonly used to represent the branching ratios of the 'standard' decay modes of the vector-like quarks involving W, Z or Higgs bosons can be generalised to include additional channels. We give an example by recasting the limits of a recent heavy quark search for this more general case. The work of J.A. Aguilar-Saavedra has been supported by MINECO Projects FPA2016-78220-C3-1-P and FPA2013-47836-C3-2-P (including ERDF), and by Junta de Andalucía Project FQM-101. The work of D.E. López-Fogliani has been supported by the Argentinian CONICET. The work of C. Muñoz has been supported in part by the Programme SEV-2012-0249 'Centro de Excelencia Severo Ochoa'. D.E. López-Fogliani and C. Muñoz also acknowledge the support of the Spanish grant FPA2015-65929-P (MINECO/FEDER, UE), and MINECO's Consolider-Ingenio 2010 Programme under grant MultiDark CSD2009-00064.
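    The abstract mentions generalising the usual BR(W)/BR(Z)/BR(H) triangle when extra decay channels are open. One plausible way to do this, shown as an illustrative sketch rather than the paper's actual prescription, is to plot the three standard branching ratios renormalised by their sum, with the remainder tracked as the total exotic branching ratio.

    ```python
    # Illustrative sketch: mapping branching ratios of a heavy quark T onto the
    # standard (W, Z, H) triangle when extra, non-standard channels are present.
    # This is an assumed construction, not the recasting used in the paper.

    def triangle_point(br_w: float, br_z: float, br_h: float) -> tuple[float, float, float]:
        """Renormalise the three 'standard' branching ratios so they sum to one;
        the remainder 1 - (br_w + br_z + br_h) is the total exotic branching ratio."""
        standard = br_w + br_z + br_h
        if standard <= 0.0:
            raise ValueError("no standard decay modes open")
        return (br_w / standard, br_z / standard, br_h / standard)

    if __name__ == "__main__":
        # Example: T -> Wb, Zt, Ht plus 40% into new supersymmetric channels.
        br_w, br_z, br_h, br_exotic = 0.30, 0.15, 0.15, 0.40
        print(triangle_point(br_w, br_z, br_h))   # (0.5, 0.25, 0.25)
        print("exotic fraction:", br_exotic)
    ```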

    Biogenic weathering bridges the nutrient gap in pristine ecosystems - a global comparison

    Get PDF
    In many pristine ecosystems the nutrient budget appears to be negative, meaning that export exceeds the input received by aeolian deposition and physico-chemical weathering. Such ecosystems should degrade rather quickly, but are often found to be surprisingly stable in the long run. Our hypothesis was that this nutrient gap is an artefact caused by not considering the contribution of photoassimilatory-mediated biogenic weathering to the overall nutrient input, which might constitute an additional, energetically directed and demand-driven pathway. Here, we first evaluated the evolution of mutualistic biogenic weathering along an Antarctic chronosequence and then compared the biogenic weathering rates under mycorrhized ecosystems over a global gradient of contrasting states of soil development. We found that the ability to perform biogenic weathering increases along its evolutionary development in the photoautotroph-symbiont interaction, and furthermore that fungal biogenic weathering is closely related to available potassium across all 16 forested sites in the study, regardless of the dominant mycorrhiza type (AM or EM), climate, and plant-species composition. Our results point towards a general alleviation of nutrient limitation at ecosystem scale via directional, energy-driven and on-demand biogenic weathering.

    Zodiacal Light Emission in the PLANCK mission

    Full text link
    The PLANCK satellite, scheduled for launch in 2007, will produce a set of all-sky maps in nine frequency bands spanning from 30 GHz to 857 GHz, with unprecedented sensitivity and resolution. Planets, minor bodies and diffuse interplanetary dust will contribute to the (sub)mm sky emission observed by PLANCK, representing a source of foreground contamination to be removed before extracting the cosmological information. The aim of this paper is to assess the expected level of contamination in the survey of the forthcoming PLANCK mission. Starting from existing far-infrared (far-IR) models of the Zodiacal Light Emission (ZLE), we present a new method to simulate the time-dependent level of contamination from the ZLE at PLANCK frequencies. We study the possibility for PLANCK to detect and separate the ZLE contribution from the other astrophysical signals. We discuss the conditions in which PLANCK will be able to increase the existing information on the ZLE and IDP physical properties. Comment: Two columns, A&A style, 19 pages, 12 figures, 4 tables, accepted for publication in A&A - 27 Jan 2006, new version: one reference added and some typos corrected
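    The far-IR ZLE models referred to above typically express the zodiacal brightness as a line-of-sight integral over the interplanetary dust emission. The schematic form below is an illustrative convention, not the exact parametrisation used in the paper.

    ```latex
    % Schematic thermal ZLE brightness seen from position \vec{r}_{\rm obs}(t)
    % in direction \hat{n}: n_d is the interplanetary dust density, E_\nu an
    % emissivity factor, B_\nu the Planck function and T(r) the dust temperature,
    % often modelled as a power law in heliocentric distance r.
    % Illustrative form only; the paper's parametrisation may differ.
    I_\nu(\hat{n}, t) \;=\; \int_0^{\infty}
        E_\nu\, n_d\!\left(\vec{r}_{\rm obs}(t) + s\,\hat{n}\right)\,
        B_\nu\!\left(T(r)\right)\, ds ,
    \qquad T(r) = T_0 \left(\frac{r}{1\,{\rm AU}}\right)^{-\delta}
    ```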

    Optimization of Planck/LFI on-board data handling

    Get PDF
    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on board the Planck mission will acquire data at a rate much higher than the data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on ground by a reversing step. This paper illustrates the LFI scientific on-board processing used to fit within the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the on-board processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, by pre-launch tests, or by the LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which outputs the optimized parameters and a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will also be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white noise rms, well within the requirements. Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 2009, Accepted 10 Nov 2009, Pub.: 29 Dec 2009; this is a preprint, not the final version
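    A minimal sketch of such a chain is given below, assuming it co-adds Naver consecutive samples, mixes sky and reference-load data with r1 and r2, and requantizes with offset O and step q before lossless coding; the exact mixing and requantization scheme of the real flight algorithm, and its lossless compression stage, are not reproduced here. The sketch only shows how a quantization error EpsilonQ could be estimated for a given parameter set.

    ```python
    # Illustrative LFI-style on-board processing chain and its ground-side
    # reversal, using the five tuning parameters named in the abstract.
    import numpy as np

    def onboard_process(sky, load, naver, r1, r2, q, O):
        """Co-add naver samples, mix sky/load with r1, r2, requantize with step q, offset O."""
        n = (len(sky) // naver) * naver
        sky_avg = sky[:n].reshape(-1, naver).mean(axis=1)
        load_avg = load[:n].reshape(-1, naver).mean(axis=1)
        p1 = sky_avg - r1 * load_avg            # assumed mixing of the two data streams
        p2 = sky_avg - r2 * load_avg
        q1 = np.round((p1 + O) / q).astype(int) # requantized integers sent to ground
        q2 = np.round((p2 + O) / q).astype(int)
        return q1, q2, sky_avg, load_avg

    def ground_reverse(q1, q2, r1, r2, q, O):
        """Invert quantization and mixing to recover the averaged sky and load."""
        p1 = q1 * q - O
        p2 = q2 * q - O
        load = (p1 - p2) / (r2 - r1)
        sky = p1 + r1 * load
        return sky, load

    rng = np.random.default_rng(0)
    sky = 1.0 + 0.002 * rng.standard_normal(44000)    # toy sky signal + white noise
    load = 1.0 + 0.002 * rng.standard_normal(44000)   # toy reference load
    naver, r1, r2, q, O = 4, 0.8, 1.2, 1e-4, 0.0
    q1, q2, sky_avg, load_avg = onboard_process(sky, load, naver, r1, r2, q, O)
    sky_rec, _ = ground_reverse(q1, q2, r1, r2, q, O)
    eps_q = np.std(sky_rec - sky_avg) / np.std(sky_avg - sky_avg.mean())
    print(f"EpsilonQ ~ {100 * eps_q:.1f}% of the (averaged) noise rms")
    ```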

    Off-line radiometric analysis of Planck/LFI data

    Get PDF
    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite, designed to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the individual components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes on both ground test data and real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008. Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
